Hands-On Learning Theory Fall 2017, Lecture 6

Author

  • Jean Honorio
Abstract

Recall that in Theorem 2.1, we analyzed empirical risk minimization with a finite hypothesis class F, i.e., |F| < +∞. Here, as in Theorem 4.1, we will prove results for a possibly infinite hypothesis class F. We will also generalize the previous results. In this lecture, we have z ∈ Z and we use hypotheses (∀h ∈ F) h : Z → R. This setting is more general than the prediction setting of Theorems 2.1 and 4.1, but it should be clear that the previous results can be generalized as well. For prediction, we have pairs (x, y) and we try to predict y ∈ Y from x ∈ X by using functions (∀f ∈ F) f : X → Y. We then used a distortion function d : Y × Y → [0, 1] and defined the risk in terms of d(y, f(x)). Letting z = (x, y), and thus Z = X × Y, we can define h(z) = d(y, f(x)) and recover the prediction problem. Next, we present several definitions that will arise naturally from our analysis in Theorem 6.1.
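To make the reduction concrete, here is a minimal Python sketch, assuming a 0/1 distortion and a toy predictor; the names d, make_h, and empirical_risk are illustrative, not from the lecture notes.

    # Reduction from prediction to the general setting h : Z -> R.

    def d(y_true, y_pred):
        # A distortion function d : Y x Y -> [0, 1]; here, the 0/1 loss.
        return 0.0 if y_true == y_pred else 1.0

    def make_h(f):
        # Turn a predictor f : X -> Y into a hypothesis h : Z -> R
        # on pairs z = (x, y), via h(z) = d(y, f(x)).
        def h(z):
            x, y = z
            return d(y, f(x))
        return h

    def empirical_risk(h, sample):
        # Average of h(z) over the sample z_1, ..., z_n.
        return sum(h(z) for z in sample) / len(sample)

    # Example with Y = {0, 1}: the empirical risk is the error rate.
    f = lambda x: 1 if x > 0 else 0
    sample = [(0.5, 1), (-0.3, 0), (0.2, 0)]
    print(empirical_risk(make_h(f), sample))  # 1/3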


Similar resources

Hands-On Learning Theory Fall 2016, Lecture 3

A basic property of the entropy of a discrete random variable x is that 0 ≤ H(x) ≤ log |X|. In fact, the entropy is maximal for the discrete uniform distribution, that is, (∀x ∈ X) p(x) = 1/|X|, in which case H(x) = log |X|. Definition 3.2 (Conditional entropy). The conditional entropy of y given x is defined as: H(y|x) = ∑_{v∈X} p_x(v) H(y|x = v) = −∑_{v∈X} p_x(v) ∑_{u∈Y} p_{y|x}(u|v) log p_{y|x}(u|v) = ...
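A short Python sketch of these quantities on empirical distributions may help; it uses natural logarithms, and the function names are illustrative, not from the lecture notes.

    import math
    from collections import Counter, defaultdict

    def entropy(xs):
        # H(x) = -sum_v p(v) log p(v), with p the empirical distribution.
        n = len(xs)
        return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

    def conditional_entropy(pairs):
        # H(y|x) = sum_{v in X} p_x(v) H(y | x = v).
        n = len(pairs)
        groups = defaultdict(list)
        for x, y in pairs:
            groups[x].append(y)
        return sum((len(ys) / n) * entropy(ys) for ys in groups.values())

    # A uniform x over {0, 1} attains the maximum H(x) = log |X| = log 2.
    print(entropy([0, 1]))                                # ~0.693
    print(conditional_entropy([(0, 0), (0, 1), (1, 1)]))  # (2/3) log 2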


Hands-On Learning Theory Fall 2017, Lecture 4

Recall that in Theorem 2.1, we analyzed empirical risk minimization with a finite hypothesis class F, i.e., |F| < +∞. Here, we will prove results for possibly infinite hypothesis classes. Although the PAC-Bayes framework is far more general, we will concentrate on the prediction problem as before, i.e., (∀f ∈ F) f : X → Y. Also, note that Theorem 2.1 could have been stated in a more general fa...


MIT 6.829: Computer Networks, Fall 2017, Lecture 18: Video

This lecture is about different techniques used for video streaming to improve performance and Quality of Experience (QoE). Specifically, we learned about "Neural Adaptive Video Streaming with Pensieve" [1]. Pensieve adaptively selects bitrates (ABR) for future video chunks based on observations collected by client video players. It automatically learns ABR algorithms that adapt to a wide range...
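As a rough illustration of the decision Pensieve automates, here is a simple buffer-based ABR heuristic in Python; this is not Pensieve's learned policy (which is trained with reinforcement learning), and all thresholds and bitrates are illustrative assumptions.

    BITRATES_KBPS = [300, 750, 1200, 2850, 4300]  # available encodings

    def select_bitrate(buffer_seconds, throughput_kbps):
        # Never pick a bitrate above the measured throughput, and
        # step down to the lowest safe rate when the buffer runs low.
        safe = [b for b in BITRATES_KBPS if b <= throughput_kbps]
        if not safe:
            return BITRATES_KBPS[0]
        if buffer_seconds < 5.0:   # near rebuffering: be conservative
            return safe[0]
        return safe[-1]            # ample buffer: highest safe rate

    print(select_bitrate(buffer_seconds=12.0, throughput_kbps=2000))  # 1200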


Hands-On Learning Theory Fall 2016, Lecture 4

Recall that in Theorem 2.1, we analyzed empirical risk minimization with a finite hypothesis class F, i.e., |F| < +∞. Here, we will prove results for possibly infinite hypothesis classes. Although the PAC-Bayes framework is far more general, we will concentrate on the prediction problem as before, i.e., (∀f ∈ F) f : X → Y. Also, note that Theorem 2.1 could have been stated in a more general fa...




Publication date: 2017